
    Adaptive estimation and change detection of correlation and quantiles for evolving data streams

    Streaming data processing is increasingly playing a central role in enterprise data architectures due to an abundance of available measurement data from a wide variety of sources and advances in data capture and infrastructure technology. Data streams arrive, with high frequency, as never-ending sequences of events, where the underlying data generating process always has the potential to evolve. Business operations often demand real-time processing of data streams for keeping models up-to-date and timely decision-making. For example, in cybersecurity contexts, analysing streams of network data can aid the detection of potentially malicious behaviour. Many tools for statistical inference cannot meet the challenging demands of streaming data, where the computational cost of updates to models must be constant to ensure continuous processing as data scales. Moreover, these tools are often not capable of adapting to changes, or drift, in the data. Thus, new tools for modelling data streams with efficient data processing and model updating capabilities, referred to as streaming analytics, are required. Regular intervention for control parameter configuration is incompatible with the truly continuous processing constraints of streaming data. There is a notable absence of such tools designed with both the temporal adaptivity to accommodate drift and the autonomy to not rely on control parameter tuning. Streaming analytics with these properties can be developed using an Adaptive Forgetting (AF) framework, with roots in adaptive filtering. The fundamental contributions of this thesis are to extend the streaming toolkit by using the AF framework to develop autonomous and temporally-adaptive streaming analytics. The first contribution uses the AF framework to demonstrate the development of a model, and validation procedure, for estimating time-varying parameters of bivariate data streams from cyber-physical systems.
This is accompanied by a novel continuous-monitoring change detection system that compares adaptive and non-adaptive estimates. The second contribution is the development of a streaming analytic for the correlation coefficient and an associated change detector to monitor changes to correlation structures across streams. This is demonstrated on cybersecurity network data. The third contribution is a procedure for estimating time-varying binomial data, with a thorough exploration of the nuanced behaviour of this estimator. The final contribution is a framework to enhance extant streaming quantile estimators with autonomous, temporally-adaptive properties. In addition, a novel streaming quantile procedure is developed and demonstrated, in an extensive simulation study, to show appealing performance.
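The forgetting-factor machinery behind such estimators can be illustrated with a minimal sketch. The class below uses a fixed forgetting factor, whereas the thesis's AF framework tunes this factor adaptively from the data; the class name, the update form, and the default value of lambda are illustrative, not taken from the thesis.

```python
class ForgettingCorrelation:
    """Exponentially weighted estimate of the correlation between two
    streams, using a fixed forgetting factor as a stand-in for the
    adaptive, data-driven factor described in the abstract."""

    def __init__(self, lam=0.95):
        self.lam = lam                         # forgetting factor in (0, 1]
        self.w = 0.0                           # effective sample weight
        self.mx = self.my = 0.0                # weighted means
        self.sxx = self.syy = self.sxy = 0.0   # weighted (co)variances

    def update(self, x, y):
        """Constant-cost update for one incoming (x, y) pair."""
        self.w = self.lam * self.w + 1.0
        a = 1.0 / self.w
        dx, dy = x - self.mx, y - self.my
        self.mx += a * dx
        self.my += a * dy
        # decay old second moments, then fold in the new deviation products
        self.sxx = (1.0 - a) * (self.sxx + a * dx * dx)
        self.syy = (1.0 - a) * (self.syy + a * dy * dy)
        self.sxy = (1.0 - a) * (self.sxy + a * dx * dy)

    def correlation(self):
        denom = (self.sxx * self.syy) ** 0.5
        return self.sxy / denom if denom > 0.0 else 0.0
```

Each `update` call costs O(1) regardless of how much data has been seen, which is the constant-cost requirement emphasised above; values of lambda closer to 1 forget the past more slowly.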

    Men and sneakers : the importance of sneakers to the male sneaker enthusiast through function, fashion and personal

    Professional project report submitted in partial fulfillment of the requirements for the degree of Master of Arts in Journalism from the School of Journalism, University of Missouri--Columbia.

    A natural histone H2A variant lacking the Bub1 phosphorylation site and regulated depletion of centromeric histone CENP-A foster evolvability in Candida albicans.

    Eukaryotes have evolved elaborate mechanisms to ensure that chromosomes segregate with high fidelity during mitosis and meiosis, and yet specific aneuploidies can be adaptive during environmental stress. Here, we identify a chromatin-based system required for inducible aneuploidy in a human pathogen. Candida albicans utilizes chromosome missegregation to acquire tolerance to antifungal drugs and for nonmeiotic ploidy reduction after mating. We discovered that the ancestor of C. albicans and two related pathogens evolved a variant of histone 2A (H2A) that lacks the conserved phosphorylation site for kinetochore-associated Bub1 kinase, a key regulator of chromosome segregation. Using engineered strains, we show that the relative gene dosage of this variant versus canonical H2A controls the fidelity of chromosome segregation and the rate of acquisition of tolerance to antifungal drugs via aneuploidy. Furthermore, whole-genome chromatin precipitation analysis reveals that Centromere Protein A/Centromeric Histone H3-like Protein (CENP-A/Cse4), a centromeric histone H3 variant that forms the platform of the eukaryotic kinetochore, is depleted from tetraploid-mating products relative to diploid parents and is virtually eliminated from cells exposed to aneuploidy-promoting cues. We conclude that genetically programmed and environmentally induced changes in chromatin can confer the capacity for enhanced evolvability via chromosome missegregation.

    Characterising Dependency in Computer Networks using Spectral Coherence

    The quantification of normal and anomalous traffic flows across computer networks is a topic of pervasive interest in network security, and requires the timely application of time-series methods. The transmission or reception of packets passing between computers can be represented in terms of time-stamped events and the resulting activity understood in terms of point-processes. Interestingly, in the disparate domain of neuroscience, models for describing dependent point-processes are well developed. In particular, spectral methods which decompose second-order dependency across different frequencies allow for a rich characterisation of point-processes. In this paper, we investigate using the spectral coherence statistic to characterise computer network activity, and determine if, and how, device messaging may be dependent. We demonstrate, on real data, that for many devices there appears to be very little dependency between device messaging channels. However, when significant coherence is detected it appears highly structured, a result which suggests coherence may prove useful for discriminating between types of activity at the network level.
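A minimal sketch of the approach described above: bin each channel's event timestamps into counts, then estimate Welch-averaged magnitude-squared coherence between the two count series. The function name, the bin width, and the segment length are illustrative choices, not values from the paper.

```python
import numpy as np


def event_coherence(times_a, times_b, bin_width=0.01, nperseg=256):
    """Welch-averaged magnitude-squared coherence between two point
    processes (e.g. packet timestamps on two messaging channels),
    computed from binned event counts."""
    t_max = max(times_a.max(), times_b.max())
    edges = np.arange(0.0, t_max + bin_width, bin_width)
    xa, _ = np.histogram(times_a, bins=edges)
    xb, _ = np.histogram(times_b, bins=edges)

    n_seg = len(xa) // nperseg
    win = np.hanning(nperseg)
    pxx = pyy = pxy = 0.0
    for k in range(n_seg):
        # demean each segment, window it, and accumulate periodograms
        sa = xa[k * nperseg:(k + 1) * nperseg].astype(float)
        sb = xb[k * nperseg:(k + 1) * nperseg].astype(float)
        fa = np.fft.rfft(win * (sa - sa.mean()))
        fb = np.fft.rfft(win * (sb - sb.mean()))
        pxx = pxx + np.abs(fa) ** 2
        pyy = pyy + np.abs(fb) ** 2
        pxy = pxy + np.conj(fa) * fb

    freqs = np.fft.rfftfreq(nperseg, d=bin_width)
    coh = np.abs(pxy) ** 2 / (pxx * pyy)
    return freqs, coh
```

For identical input streams the estimate is 1 at every frequency, while for independent streams it hovers roughly around one over the number of averaged segments, so genuinely structured coherence of the kind reported above stands out against that floor.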

    Consistent probabilistic outputs for protein function prediction

    In predicting hierarchical protein function annotations, such as terms in the Gene Ontology (GO), the simplest approach makes predictions for each term independently. However, this approach has the unfortunate consequence that the predictor may assign to a single protein a set of terms that are inconsistent with one another; for example, the predictor may assign a specific GO term to a given protein ('purine nucleotide binding') but not assign the parent term ('nucleotide binding'). Such predictions are difficult to interpret. In this work, we focus on methods for calibrating and combining independent predictions to obtain a set of probabilistic predictions that are consistent with the topology of the ontology. We call this procedure 'reconciliation'. We begin with a baseline method for predicting GO terms from a collection of data types using an ensemble of discriminative classifiers. We apply the method to a previously described benchmark data set, and we demonstrate that the resulting predictions are frequently inconsistent with the topology of the GO. We then consider 11 distinct reconciliation methods: three heuristic methods; four variants of a Bayesian network; an extension of logistic regression to the structured case; and three novel projection methods - isotonic regression and two variants of a Kullback-Leibler projection method. We evaluate each method in three different modes - per term, per protein and joint - corresponding to three types of prediction tasks. Although the principal goal of reconciliation is interpretability, it is important to assess whether interpretability comes at a cost in terms of precision and recall. Indeed, we find that many apparently reasonable reconciliation methods yield reconciled probabilities with significantly lower precision than the original, unreconciled estimates. 
On the other hand, we find that isotonic regression usually performs better than the underlying, unreconciled method, and almost never performs worse; isotonic regression appears to be able to use the constraints from the GO network to its advantage. An exception to this rule is the high-precision regime for joint evaluation, where Kullback-Leibler projection yields the best performance.
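As a concrete toy example of reconciliation, the function below forces consistency by raising each term's probability to the maximum over its descendants, so that no child term can outscore an ancestor. It is an illustrative sketch in the spirit of the paper's heuristic methods, not a verbatim implementation of any of the eleven; the function and variable names are hypothetical.

```python
def reconcile_up(probs, parents):
    """Make probabilities consistent with an ontology DAG by propagating
    the maximum probability of each term's descendants up to the term.

    probs:   dict term -> raw predicted probability
    parents: dict term -> list of parent terms (empty list for roots)
    """
    # invert the parent map to get each term's children
    children = {t: [] for t in probs}
    for term, ps in parents.items():
        for p in ps:
            children[p].append(term)

    reconciled = {}

    def value(term):
        # memoised recursion: each term resolved once, children before parents
        if term not in reconciled:
            reconciled[term] = max(
                [probs[term]] + [value(c) for c in children[term]]
            )
        return reconciled[term]

    for term in probs:
        value(term)
    return reconciled
```

For a chain root -> 'nucleotide binding' -> 'purine nucleotide binding' with raw scores 0.2, 0.5, and 0.9, all three reconciled scores become 0.9, restoring the parent-at-least-child property violated in the GO example above.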

    Anticoagulation of cancer patients with non-valvular atrial fibrillation receiving chemotherapy: Guidance from the SSC of the ISTH

    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/150593/1/jth14478.pdf
    https://deepblue.lib.umich.edu/bitstream/2027.42/150593/2/jth14478_am.pd

    Design and rationale of the B-lines lung ultrasound guided emergency department management of acute heart failure (BLUSHED-AHF) pilot trial

    Background Medical treatment for acute heart failure (AHF) has not changed substantially over the last four decades. Emergency department (ED)-based evidence for treatment is limited. Outcomes remain poor, with a 25% mortality or re-admission rate within 30 days post discharge. Targeting pulmonary congestion, which can be objectively assessed using lung ultrasound (LUS), may be associated with improved outcomes. Methods BLUSHED-AHF is a multicenter, randomized, pilot trial designed to test whether a strategy of care that utilizes a LUS-driven treatment protocol outperforms usual care for reducing pulmonary congestion in the ED. We will randomize 130 ED patients with AHF across five sites to either (a) a structured treatment strategy guided by LUS or (b) a structured treatment strategy guided by usual care. LUS-guided care will continue until there are ≤15 B-lines on LUS or until 6 h post enrollment. The primary outcome is the proportion of patients with ≤15 B-lines at the conclusion of 6 h of management. Patients will continue to undergo serial LUS exams during hospitalization to better understand the time course of pulmonary congestion. Follow-up will occur through 90 days, exploring days-alive-and-out-of-hospital between the two arms. The study is registered on ClinicalTrials.gov (NCT03136198). Conclusion If successful, this pilot study will inform the design of a future, larger trial of LUS-driven therapy aimed at guiding treatment and improving outcomes in patients with AHF.

    Shedding light on the elusive role of endothelial cells in cytomegalovirus dissemination.

    Cytomegalovirus (CMV) is frequently transmitted by solid organ transplantation and is associated with graft failure. By forming the boundary between circulation and organ parenchyma, endothelial cells (EC) are suited for bidirectional virus spread from and to the transplant. We applied Cre/loxP-mediated green-fluorescence-tagging of EC-derived murine CMV (MCMV) to quantify the role of infected EC in transplantation-associated CMV dissemination in the mouse model. Both EC- and non-EC-derived virus originating from infected Tie2-cre(+) heart and kidney transplants were readily transmitted to MCMV-naïve recipients by primary viremia. In contrast, when a Tie2-cre(+) transplant was infected by primary viremia in an infected recipient, the recombined EC-derived virus spread poorly to recipient tissues. Similarly, in the reverse direction, EC-derived virus from infected Tie2-cre(+) recipient tissues spread poorly to the transplant. These data contradict any privileged role of EC in CMV dissemination and challenge an indiscriminate applicability of the primary and secondary viremia concept of virus dissemination.

    Brain structure and neurocognitive function in two professional mountaineers during 35 days of severe normobaric hypoxia

    Background and purpose Animal studies suggest that exposure to severe ambient hypoxia for several days may have beneficial long-term effects on neurodegenerative diseases. Because the acute risks to brain structure and function of exposing human beings to prolonged severe hypoxia are uncertain, we conducted a pilot study in healthy persons. Methods We included two professional mountaineers (participants A and B) in a 35-day study comprising an acclimatization period and 14 consecutive days with oxygen concentrations between 8% and 8.8%. They underwent cerebral magnetic resonance imaging at seven time points and a cognitive test battery covering a spectrum of cognitive domains at 27 time points. We analysed blood neuron-specific enolase and neurofilament light chain levels before, during, and after hypoxia. Results In hypoxia, white matter volumes increased (maximum: A, 4.3% ± 0.9%; B, 4.5% ± 1.9%) whilst gray matter volumes (A, −1.5% ± 0.8%; B, −2.5% ± 0.9%) and cerebrospinal fluid volumes (A, −2.7% ± 2.4%; B, −5.9% ± 8.2%) decreased. Furthermore, the number (A, 11–17; B, 26–126) and volumes (A, 140%; B, 285%) of white matter hyperintensities increased in hypoxia but had returned to baseline after a 3.5-month recovery phase. Diffusion-weighted imaging of the white matter indicated cytotoxic edema formation. We did not observe changes in cognitive performance or biochemical brain injury markers. Discussion In highly selected healthy individuals, severe sustained normobaric hypoxia over 2 weeks elicited reversible changes in brain morphology without clinically relevant changes in cognitive function or brain injury markers. This finding may pave the way for future translational studies assessing the therapeutic potential of hypoxia in neurodegenerative diseases.